On the optimality of the random hyperplane rounding technique for MAX CUT (Draft)

Author

  • Uriel Feige
Abstract

MAX CUT is the problem of partitioning the vertices of a graph into two sets, maximizing the number of edges joining these sets. This problem is NP-hard. Goemans and Williamson proposed an algorithm that first uses a semidefinite programming relaxation of MAX CUT to embed the vertices of the graph on the surface of an n-dimensional sphere, and then uses a random hyperplane to cut the sphere in two, giving a cut of the graph. They show that the expected number of edges in the random cut is at least α·sdp, where α ≈ 0.87856 and sdp is the value of the semidefinite program, which is an upper bound on opt, the number of edges in the maximum cut. This manuscript shows the following results:

1. The integrality ratio of the semidefinite program is α. The previously known bound on the integrality ratio was roughly 0.8845.

2. In the presence of the so-called "triangle constraints", the integrality ratio is no better than roughly 0.891. The previously known bound was above 0.95.

3. There are graphs and optimal embeddings for which the best hyperplane approximates opt within a ratio no better than α, even in the presence of additional valid constraints. This strengthens a result of Karloff that applied only to the expected number of edges cut by a random hyperplane.

1 The Algorithm of Goemans and Williamson

For a graph G(V, E) with |V| = n and |E| = m, MAX CUT is the problem of partitioning V into two sets such that the number of edges connecting the two sets is maximized. This problem is NP-hard to approximate within ratios better than 16/17 [11]. Partitioning the vertices into two sets at random gives a cut whose expected number of edges is m/2, trivially giving an approximation algorithm with expected ratio at least 1/2. For many years, nothing substantially better was known. In a major breakthrough, Goemans and Williamson [9] gave an algorithm with approximation ratio 0.87856. For completeness, we review their well-known algorithm, which we call algorithm GW.
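The hyperplane rounding step described above can be sketched in a few lines, assuming the SDP embedding is already available as an array of unit vectors (one per vertex). The function name `random_hyperplane_cut` and the input layout are illustrative, not from the paper. The sketch relies on the standard fact, due to Goemans and Williamson, that a random hyperplane through the origin separates vertices u and v with probability arccos(v_u · v_v)/π.

```python
import numpy as np

def random_hyperplane_cut(vectors, edges, seed=None):
    """Round SDP unit vectors to a cut via a random hyperplane.

    vectors: (n, d) array, one unit vector per vertex (the SDP embedding).
    edges: list of (u, v) vertex-index pairs.
    Returns (side, cut_size), where side[i] is +1 or -1 depending on
    which side of the hyperplane vertex i's vector falls.
    """
    rng = np.random.default_rng(seed)
    # Gaussian vector r gives a uniformly random hyperplane normal.
    r = rng.standard_normal(vectors.shape[1])
    side = np.sign(vectors @ r)
    side[side == 0] = 1.0  # tie-break; a probability-zero event
    # An edge is cut when its endpoints land on opposite sides.
    cut_size = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut_size
```

With antipodal endpoint vectors (angle π), every hyperplane separates them, so such an edge is always cut, matching the arccos(·)/π probability of 1.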




Publication date: 2000